
# Japanese Masked Language Modeling

## Deberta V2 Base Japanese

A Japanese DeBERTa V2 base model pretrained on the Japanese Wikipedia, CC-100, and OSCAR corpora, suitable for masked language modeling and for fine-tuning on downstream tasks.

Tags: Large Language Model, Transformers, Japanese · ku-nlp · 38.93k downloads · 29 likes
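
For masked language modeling with this model, a minimal sketch along the following lines should work, assuming the Hugging Face repo id ku-nlp/deberta-v2-base-japanese. Per the model card, input text is segmented into words with Juman++ beforehand, so the example sentence below is already whitespace-segmented.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed repo id; adjust if the hosted name differs.
model_id = "ku-nlp/deberta-v2-base-japanese"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Juman++-segmented sentence with one masked position.
text = "京都 大学 で 自然 言語 処理 を [MASK] する 。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and print the top-scoring token for it.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(top_id))
```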

## Roberta Large Japanese With Auto Jumanpp

A large Japanese RoBERTa model pretrained on Japanese Wikipedia and the Japanese portion of CC-100, whose tokenizer applies Juman++ segmentation automatically.

Tags: Large Language Model, Transformers, Japanese · nlp-waseda · 139 downloads · 4 likes

## Deberta Base Japanese Wikipedia

A DeBERTa (V2) model pretrained on Japanese Wikipedia and Aozora Bunko texts, suitable for Japanese text-processing tasks.

Tags: Large Language Model, Transformers, Japanese · KoichiYasuoka · 32 downloads · 2 likes

## Albert Base Japanese V1 With Japanese Tokenizer

A Japanese-pretrained ALBERT model that uses BertJapaneseTokenizer as its tokenizer, which makes handling Japanese text more convenient. License: MIT.

Tags: Large Language Model, Transformers, Japanese · ken11 · 44 downloads · 3 likes
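
Because this model bundles BertJapaneseTokenizer, raw (unsegmented) Japanese can be fed straight to a fill-mask pipeline. A minimal sketch, assuming the Hugging Face repo id ken11/albert-base-japanese-v1-with-japanese-tokenizer and that fugashi plus a MeCab dictionary are installed for the word tokenizer:

```python
from transformers import pipeline

# Assumed repo id; the bundled tokenizer segments Japanese internally.
fill_mask = pipeline(
    "fill-mask",
    model="ken11/albert-base-japanese-v1-with-japanese-tokenizer",
)

# Print the top candidates for the masked position.
for candidate in fill_mask("日本の首都は[MASK]です。"):
    print(candidate["token_str"], round(candidate["score"], 3))
```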